

Search for: All records

Creators/Authors contains: "Harris, T"


  1. Verification of program safety is often reducible to proving the unsatisfiability (i.e., validity) of a formula in Satisfiability Modulo Theories (SMT): Boolean logic combined with theories that formalize arbitrary first-order fragments. Zero-knowledge (ZK) proofs allow SMT formulas to be validated without revealing the underlying formulas or their proofs to other parties, which is a crucial building block for proving the safety of proprietary programs. Recently, Luo et al. studied the simpler problem of proving the unsatisfiability of pure Boolean formulas, but their approach does not support proofs generated by SMT solvers. This work presents ZKSMT, a novel framework for proving the validity of SMT formulas in ZK. We design a virtual machine (VM) tailored to efficiently represent the verification process of SMT validity proofs in ZK. Our VM can support the vast majority of popular theories when proving program safety while being complete and sound. To demonstrate this, we instantiate the commonly used theories of equality and linear integer arithmetic in our VM with theory-specific optimizations for proving them in ZK. ZKSMT achieves high practicality even when running on realistic SMT formulas generated by Boogie, a common tool for software verification. It achieves a three-order-of-magnitude improvement compared to a baseline that executes the proof verification code in a general ZK system. 
  2. This article outlines the key components of the River’s Edge Construction lesson plan. An explanation of how the lesson was delivered is presented alongside suggestions for implementation by K–6 teachers. The integration of scientific literacy is discussed first, followed by a discussion of each of the 5Es (Bybee et al. 2006). A timeframe for distributing the lesson phases is given; however, the activities included in this plan (see Supplementary Resources for specific lesson materials) should be modified to meet the needs and interests of students, and to align with allotted instructional time and objectives. 
  3. This article outlines the key components of the River’s Edge Construction lesson plan. An explanation of how the lesson was delivered is presented alongside suggestions for implementation by K–6 teachers. The integration of scientific literacy is discussed first, followed by a discussion of each of the 5Es (Bybee et al. 2006). A timeframe for distributing the lesson phases is given; however, the activities included in this plan (see Supplementary Resources for specific lesson materials) should be modified to meet the needs and interests of students, and to align with allotted instructional time and objectives. 
  4. Comparing the spatial characteristics of spatiotemporal random fields is often in demand. However, the comparison can be challenging due to the high dimensionality of and dependence in the data. We develop a new multiple testing approach to detect local differences in the spatial characteristics of two spatiotemporal random fields by taking the spatial information into account. Our method adopts a two-component mixture model for location-wise p-values and then derives a new false discovery rate (FDR) control, called the mirror procedure, to determine the optimal rejection region. This procedure is robust to model misspecification and allows for weak dependency among hypotheses. To integrate the spatial heterogeneity, we model the mixture probability and study the benefit, if any, of allowing the alternative distribution to be spatially varying. An EM algorithm is developed to estimate the mixture model and implement the FDR procedure. We study the FDR control and the power of our new approach both theoretically and numerically, and apply the approach to compare the mean and teleconnection pattern between two synthetic climate fields. Supplementary materials for this article are available online. 
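    The core idea of a mirror-style FDR procedure can be sketched in a few lines: among p-values below a threshold t, the count of "mirror" p-values above 1 − t serves as an estimate of the number of false discoveries. This is a generic illustration, not the authors' implementation — the function name `mirror_fdr`, the candidate-threshold choice, and the +1 correction in the numerator are assumptions, and the article's spatially varying mixture modeling is omitted here.

    ```python
    import numpy as np

    def mirror_fdr(pvals, alpha=0.1):
        """Reject p-values below the largest threshold t whose estimated
        false discovery proportion is at most alpha. The number of p-values
        in the mirror region [1 - t, 1] proxies the number of nulls among
        the rejections {p <= t}."""
        p = np.asarray(pvals)
        best_t = 0.0
        # candidate thresholds: the observed p-values below 1/2
        for t in np.sort(p[p < 0.5]):
            n_rej = np.sum(p <= t)
            n_mirror = np.sum(p >= 1.0 - t)
            fdp_hat = (1.0 + n_mirror) / max(1, n_rej)
            if fdp_hat <= alpha:
                best_t = max(best_t, t)
        return p <= best_t

    # usage: 900 null p-values (uniform) mixed with 100 small alternative p-values
    rng = np.random.default_rng(0)
    pvals = np.concatenate([rng.uniform(size=900), rng.beta(0.1, 10.0, size=100)])
    rejected = mirror_fdr(pvals, alpha=0.1)
    ```

    Because the mirror count is computed from the data themselves rather than from an assumed null density, this style of estimate tends to be robust to misspecification of the null model, which is the property the abstract highlights.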
  5. Climate field reconstructions (CFRs) attempt to estimate spatiotemporal fields of climate variables in the past using climate proxies such as tree rings, ice cores, and corals. Data assimilation (DA) methods are a recent and promising new means of deriving CFRs that optimally fuse climate proxies with climate model output. Despite the growing application of DA-based CFRs, little is understood about how much the assimilated proxies change the statistical properties of the climate model data. To address this question, we propose a robust and computationally efficient method, based on functional data depth, to evaluate differences in the distributions of two spatiotemporal processes. We apply our test to study global and regional proxy influence in DA-based CFRs by comparing the background and analysis states, which are treated as two samples of spatiotemporal fields. We find that the analysis states are significantly altered from the climate-model-based background states due to the assimilation of proxies. Moreover, the difference between the analysis and background states increases with the number of proxies, even in regions far beyond proxy collection sites. Our approach allows us to characterize the added value of proxies, indicating where and when the analysis states are distinct from the background states. Supplementary materials for this article are available online. 
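    The abstract does not spell out its depth statistic, so as a hypothetical illustration of how functional data depth can flag distributional differences between two samples, here is a minimal modified band depth (in the spirit of López-Pintado and Romo) for curves: systematically low depths of one sample relative to the other's ensemble suggest a distributional shift. The function name and the reduction of spatiotemporal fields to one-dimensional curves are assumptions for the sketch.

    ```python
    import numpy as np

    def modified_band_depth(sample, reference):
        """Modified band depth of each curve in `sample` relative to the
        ensemble `reference` (shape: n_curves x n_timepoints): the average,
        over all pairs of reference curves, of the fraction of time points
        at which the curve lies inside the band spanned by the pair."""
        n = reference.shape[0]
        depths = []
        for y in sample:
            inside = 0.0
            pairs = 0
            for i in range(n):
                for j in range(i + 1, n):
                    lo = np.minimum(reference[i], reference[j])
                    hi = np.maximum(reference[i], reference[j])
                    inside += np.mean((lo <= y) & (y <= hi))
                    pairs += 1
            depths.append(inside / pairs)
        return np.array(depths)

    # usage: a curve near the middle of the reference ensemble gets high
    # depth, while an outlying curve gets depth near zero
    reference = np.tile(np.arange(10.0)[:, None], (1, 4))  # 10 flat curves at levels 0..9
    sample = np.array([[4.5] * 4, [100.0] * 4])
    depths = modified_band_depth(sample, reference)
    ```

    A two-sample comparison would then test whether the depths of one sample, computed with respect to the other, are stochastically smaller than expected under equality of distributions — for example via a rank-based or permutation test.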
  6. The use of radical bridging ligands to facilitate strong magnetic exchange between paramagnetic metal centers represents a key step toward the realization of single-molecule magnets with high operating temperatures. Moreover, bridging ligands that allow the incorporation of high-anisotropy metal ions are particularly advantageous. Toward these ends, we report the synthesis and detailed characterization of the dinuclear hydroquinone-bridged complexes [(Me₆tren)₂M(II)₂(C₆H₄O₂²⁻)]²⁺ (Me₆tren = tris(2-dimethylaminoethyl)amine; M = Fe, Co, Ni) and their one-electron-oxidized, semiquinone-bridged analogues [(Me₆tren)₂M(II)₂(C₆H₄O₂•⁻)]³⁺. Single-crystal X-ray diffraction shows that the Me₆tren ligand restrains the metal centers in a trigonal bipyramidal geometry, and coordination of the bridging hydro- or semiquinone ligand results in a parallel alignment of the three-fold axes. We quantify the p-benzosemiquinone–transition metal magnetic exchange coupling for the first time and find that the nickel(II) complex exhibits a substantial J < −600 cm⁻¹, resulting in a well-isolated S = 3/2 ground state even as high as 300 K. The iron and cobalt complexes feature metal–semiquinone exchange constants of J = −144(1) and −252(2) cm⁻¹, respectively, which are substantially larger in magnitude than those reported for related bis(bidentate) semiquinoid complexes. Finally, the semiquinone-bridged cobalt and nickel complexes exhibit field-induced slow magnetic relaxation, with relaxation barriers of Ueff = 22 and 46 cm⁻¹, respectively. Remarkably, the Orbach relaxation observed for the Ni complex is in stark contrast to the fast processes that dominate relaxation in related mononuclear Ni(II) complexes, thus demonstrating that strong magnetic coupling can engender slow magnetic relaxation. 
  7. A search for the decay $B_c^+ \to \chi_{c1}(3872)\pi^+$ is reported using proton-proton collision data collected with the LHCb detector between 2011 and 2018 at centre-of-mass energies of 7, 8, and 13 TeV, corresponding to an integrated luminosity of 9 fb⁻¹. No significant signal is observed. Using the decay $B_c^+ \to \psi(2S)\pi^+$ as a normalisation channel, an upper limit for the ratio of branching fractions
$$ {\mathcal{R}}_{\psi(2S)}^{\chi_{c1}(3872)} = \frac{{\mathcal{B}}_{B_c^{+}\to {\chi}_{c1}(3872){\pi}^{+}}}{{\mathcal{B}}_{B_c^{+}\to \psi(2S){\pi}^{+}}}\times \frac{{\mathcal{B}}_{\chi_{c1}(3872)\to J/\psi\,{\pi}^{+}{\pi}^{-}}}{{\mathcal{B}}_{\psi(2S)\to J/\psi\,{\pi}^{+}{\pi}^{-}}} < 0.05\,(0.06) $$
is set at the 90 (95)% confidence level. 
    Free, publicly-accessible full text available June 1, 2026
  8. The branching fraction of the decay B⁺ → ψ(2S)ϕ(1020)K⁺, relative to the topologically similar decay B⁺ → J/ψϕ(1020)K⁺, is measured using proton-proton collision data collected by the LHCb experiment at center-of-mass energies of 7, 8, and 13 TeV, corresponding to an integrated luminosity of 9 fb⁻¹. The ratio is found to be 0.061 ± 0.004 ± 0.009, where the first uncertainty is statistical and the second systematic. Using the world-average branching fraction for B⁺ → J/ψϕ(1020)K⁺, the branching fraction for the decay B⁺ → ψ(2S)ϕ(1020)K⁺ is found to be (3.0 ± 0.2 ± 0.5 ± 0.2) × 10⁻⁶, where the first uncertainty is statistical, the second systematic, and the third is due to the branching fraction of the normalization channel. © 2025 CERN, for the LHCb Collaboration. 
    Free, publicly-accessible full text available May 1, 2026